Supplement to “A Generalized Least Squares Matrix Decomposition”

Authors

  • Genevera I. Allen
  • Logan Grosenick
  • Jonathan Taylor
Abstract

In addition to sparseness, there is much interest in penalties that encourage smoothness, especially in the context of functional data analysis. We show how the GPMF can be used with smooth penalties and propose a generalized gradient descent method to solve for these smooth GPMF factors. Many have proposed to obtain smoothness in the factors by using a quadratic penalty. Rice and Silverman (1991) suggested P(v) = v^T Ω v, where Ω is the matrix of squared second or fourth differences. As this penalty is not homogeneous of order one, we use the Ω-norm penalty instead: P(v) = (v^T Ω v)^{1/2} = ||v||_Ω. Since this penalty is a norm or a semi-norm, the GPMF solution given in Theorem 2 can be employed. We seek to minimize the following Ω-norm penalized regression problem: (1/2) ||X^T Q u − v||_R^2 + λ_v ||v||_Ω. Notice that this problem is similar in structure to the group lasso problem of Yuan and Lin (2006) with a single group. To solve the Ω-norm penalized regression problem, we use a generalized gradient descent method. (We note that there are more elaborate first-order solvers, introduced in recent works such as Becker et al. (2009, 2010); we describe a simple version of such solvers.) Suppose
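The Ω-norm penalized regression above can be sketched with a proximal (generalized) gradient iteration. A minimal sketch, assuming Ω is positive definite: the change of variables w = Ω^{1/2} v turns the penalty into a plain Euclidean norm ||w||_2, whose proximal operator is block soft-thresholding, as in the single-group lasso. The function name and the change-of-variables route are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def omega_norm_prox_descent(a, R, Omega, lam, n_iter=500):
    """Sketch: solve min_v 0.5*||a - v||_R^2 + lam*||v||_Omega by
    proximal gradient descent after the substitution w = Omega^{1/2} v
    (assumes Omega positive definite; hypothetical helper)."""
    # symmetric square root of Omega and its inverse via eigendecomposition
    evals, evecs = np.linalg.eigh(Omega)
    A = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T  # Omega^{-1/2}, so v = A w
    Om_half = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    # step size 1/L, where L bounds the Lipschitz constant of the smooth part
    L = np.linalg.norm(A.T @ R @ A, 2)
    w = Om_half @ a  # warm start at v = a
    for _ in range(n_iter):
        grad = A.T @ R @ (A @ w - a)   # gradient of the smooth term in w
        z = w - grad / L               # gradient step
        nz = np.linalg.norm(z)
        # proximal step: block soft-threshold the whole vector (one group)
        w = max(0.0, 1.0 - lam / (L * nz)) * z if nz > 0 else z
    return A @ w  # map back to v
```

With Ω = R = I this reduces to the one-group lasso, where the solution is the closed-form shrinkage max(0, 1 − λ/||a||) a, which the iteration recovers exactly.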


Similar articles

A Generalized Least Squares Matrix Decomposition

Variables in high-dimensional data sets common in neuroimaging, spatial statistics, time series and genomics often exhibit complex dependencies that can arise, for example, from spatial and/or temporal processes or latent network structures. Conventional multivariate analysis techniques often ignore these relationships. We propose a generalization of the singular value decomposition that is app...


Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...


Least-Squares Solutions of the Matrix Equation AXA = B Over Bisymmetric Matrices and its Optimal Approximation

A real n × n symmetric matrix X = (x_ij)_{n×n} is called a bisymmetric matrix if x_ij = x_{n+1−j, n+1−i}. Based on the projection theorem, the canonical correlation decomposition and the generalized singular value decomposition, a method useful for finding the least-squares solutions of the matrix equation AXA = B over bisymmetric matrices is proposed. The expression of the least-squares solutions ...
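The bisymmetry condition above (symmetry about both the main diagonal and the anti-diagonal) can be checked mechanically; a small illustrative helper, not taken from the paper:

```python
import numpy as np

def is_bisymmetric(X, tol=1e-12):
    """Check that X is symmetric (x_ij == x_ji) and also satisfies
    x_ij == x_{n+1-j, n+1-i}, i.e. symmetry about the anti-diagonal."""
    X = np.asarray(X)
    n = X.shape[0]
    J = np.fliplr(np.eye(n))  # exchange matrix: J @ X^T @ J has entries x_{n+1-j, n+1-i}
    return (np.allclose(X, X.T, atol=tol)
            and np.allclose(X, J @ X.T @ J, atol=tol))
```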


Level choice in truncated total least squares

The method of truncated total least squares [2] is an alternative to the classical truncated singular value decomposition used for the regularization of ill-conditioned linear systems Ax ≈ b [3]. Truncation methods aim at limiting the contribution of noise or rounding errors by cutting off a certain number of terms in an expansion such as the singular value decomposition. To this end a truncati...
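The classical truncated SVD baseline mentioned in this excerpt (not the truncated total least squares method itself) amounts to discarding the small singular values of A when solving Ax ≈ b; a minimal sketch, with the function name an illustrative assumption:

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Regularize an ill-conditioned system Ax ~= b by keeping only the
    k largest singular values of A in the SVD expansion of the solution."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # s is sorted descending
    coeffs = (U[:, :k].T @ b) / s[:k]                  # first k expansion terms
    return Vt[:k].T @ coeffs
```

Choosing the truncation level k is exactly the model-selection question the excerpt's truncation criterion addresses.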


Penalized least squares versus generalized least squares representations of linear mixed models

The methods in the lme4 package for R for fitting linear mixed models are based on sparse matrix methods, especially the Cholesky decomposition of sparse positive-semidefinite matrices, in a penalized least squares representation of the conditional model for the response given the random effects. The representation is similar to that in Henderson’s mixed-model equations. An alternative represen...



Publication year: 2013